31 research outputs found

    Multiplicative versus additive noise in multi-state neural networks

    The effects of a variable amount of random dilution of the synaptic couplings in Q-Ising multi-state neural networks with Hebbian learning are examined. A fraction of the couplings is explicitly allowed to be anti-Hebbian. Random dilution represents the dying or pruning of synapses and, hence, a static disruption of the learning process, which can be considered a form of multiplicative noise in the learning rule. Both parallel and sequential updating of the neurons can be treated. Symmetric dilution in the statics of the network is studied using the mean-field theory approach of statistical mechanics. General dilution, including asymmetric pruning of the couplings, is examined using the generating functional (path-integral) approach of disordered systems. It is shown that random dilution acts as additive Gaussian noise in the Hebbian learning rule, with zero mean and a variance depending on the connectivity of the network and on the symmetry. Furthermore, a scaling factor appears that essentially measures the average amount of anti-Hebbian couplings.
    Comment: 15 pages, 5 figures, to appear in the proceedings of the Conference on Noise in Complex Systems and Stochastic Dynamics II (SPIE International
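    The coupling construction described above can be sketched in a few lines. This is an illustrative reading of the abstract, not the paper's code; the dilution probability `c` and the anti-Hebbian fraction `a` are hypothetical parameter names:

```python
import numpy as np

def diluted_hebbian_couplings(patterns, c=0.8, a=0.1, seed=None):
    """Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu,
    symmetrically diluted (each bond kept with probability c) and with
    a fraction a of the surviving bonds sign-flipped (anti-Hebbian)."""
    rng = np.random.default_rng(seed)
    p, N = patterns.shape
    J = patterns.T @ patterns / N                    # standard Hebbian rule
    keep = (rng.random((N, N)) < c).astype(float)    # multiplicative dilution
    flip = np.where(rng.random((N, N)) < a, -1.0, 1.0)
    # symmetrise the disorder so that J_ij = J_ji (symmetric dilution)
    keep = np.triu(keep, 1); keep = keep + keep.T
    flip = np.triu(flip, 1); flip = flip + flip.T
    J = J * keep * flip
    np.fill_diagonal(J, 0.0)                         # no self-couplings
    return J
```

    Averaging over the dilution disorder is what produces, in the analysis, an effective additive Gaussian noise term in the learning rule.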

    A spherical Hopfield model

    We introduce a spherical Hopfield-type neural network in which both the neurons and the patterns are continuous variables. We study both the thermodynamics and the dynamics of this model. In order to have a retrieval phase, a quartic term is added to the Hamiltonian. The thermodynamics of the model is exactly solvable and the results are replica symmetric. A Langevin dynamics leads to a closed set of equations for the order parameters and for the effective correlation and response functions typical of neural networks. The stationary limit corresponds to the thermodynamic results. Numerical calculations illustrate our findings.
    Comment: 9 pages LaTeX including 3 eps figures. Addition of an author in the HTML abstract unintentionally forgotten; no changes to the manuscript.

    The Blume-Emery-Griffiths neural network: dynamics for arbitrary temperature

    The parallel dynamics of the fully connected Blume-Emery-Griffiths neural network model is studied for arbitrary temperature. By employing a probabilistic signal-to-noise approach, a recursive scheme is found that determines the time evolution of the distribution of the local fields and, hence, the evolution of the order parameters. This approach is compared with the generating functional method, which allows one to calculate any physically relevant quantity as a function of time. Explicit analytic formulas are given in both methods for the first few time steps of the dynamics. Up to the third time step the results are identical. Some arguments are presented as to why the results differ beyond the third time step for certain values of the model parameters. Furthermore, fixed-point equations are derived in the stationary limit. Numerical simulations confirm our theoretical findings.
    Comment: 26 pages in LaTeX, 8 eps figures

    An optimal Q-state neural network using mutual information

    Starting from the mutual information, we present a method to find a Hamiltonian for a fully connected neural network model with an arbitrary, finite number of neuron states, Q. For small initial correlations between the neurons and the patterns it leads to optimal retrieval performance. For binary neurons, Q=2, and biased patterns we recover the Hopfield model. For three-state neurons, Q=3, we recover the recently introduced Blume-Emery-Griffiths network Hamiltonian. We derive its phase diagram and compare it with those of related three-state models. We find that its retrieval region is the largest.
    Comment: 8 pages, 1 figure

    Correlated patterns in non-monotonic graded-response perceptrons

    The optimal capacity of graded-response perceptrons storing biased and spatially correlated patterns with non-monotonic input-output relations is studied. It is shown that only the structure of the output patterns is important for the overall performance of the perceptrons.
    Comment: 4 pages, 4 figures

    Instability of frozen-in states in synchronous Hebbian neural networks

    The full dynamics of a synchronous recurrent neural network model with Ising binary units and a Hebbian learning rule with a finite self-interaction is studied in order to determine the stability, against synaptic and stochastic noise, of the frozen-in states that appear in the absence of both kinds of noise. Both the numerical simulation procedure of Eissfeller and Opper and a new alternative procedure that allows the dynamics to be followed over larger time scales are used in this work. It is shown that synaptic noise destabilizes the frozen-in states and yields either retrieval or paramagnetic states when the stochastic noise is not too large. The indications are that the same results may follow in the absence of synaptic noise, for low stochastic noise.
    Comment: 14 pages and 4 figures; accepted for publication in J. Phys. A: Math. Gen.

    The Little-Hopfield model on a Random Graph

    We study the Hopfield model on a random graph in scaling regimes where the average number of connections per neuron is finite and where the spin dynamics is governed by a synchronous execution of the microscopic update rule (the Little-Hopfield model). We solve this model within replica symmetry and, using bifurcation analysis, we prove that the spin-glass/paramagnetic and the retrieval/paramagnetic transition lines of our phase diagram are identical to those of sequential dynamics. The first-order retrieval/spin-glass transition line follows by direct evaluation of our observables using population dynamics. Within numerical precision, and for sufficiently small values of the connectivity parameter, we find that this line coincides with the corresponding sequential one. Comparison with simulation experiments shows excellent agreement.
    Comment: 14 pages, 4 figures
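    The defining feature of the Little-Hopfield (synchronous) dynamics is that all spins are updated in parallel from the same field configuration, in contrast to sequential Glauber updates. A minimal zero-temperature sketch of this update rule on a fully connected Hebbian matrix (an illustration of the standard rule, not the paper's finite-connectivity graph model):

```python
import numpy as np

def hebbian_matrix(patterns):
    """Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu with J_ii = 0."""
    p, N = patterns.shape
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def little_hopfield_step(J, s):
    """One synchronous (parallel) update of all spins: every neuron i is
    updated simultaneously from the field h_i = sum_j J_ij s_j."""
    h = J @ s
    return np.where(h >= 0, 1, -1)
```

    For a single stored pattern xi, the field at xi is h_i = xi_i (N-1)/N, so the pattern is an exact fixed point of the synchronous map.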

    Symmetric sequence processing in a recurrent neural network model with a synchronous dynamics

    The synchronous dynamics and the stationary states of a recurrent attractor neural network model with synapses that compete between symmetric sequence processing and Hebbian pattern reconstruction are studied in this work, allowing for the presence of a self-interaction for each unit. Phase diagrams of stationary states are obtained, exhibiting phases of retrieval, symmetric and period-two cyclic states, as well as correlated and frozen-in states, in the absence of noise. The frozen-in states are destabilised by synaptic noise, and well-separated regions of correlated and cyclic states are obtained. Excitatory or inhibitory self-interactions yield enlarged phases of fixed-point or cyclic behaviour.
    Comment: Accepted for publication in Journal of Physics A: Mathematical and Theoretical

    Statistical mechanics and stability of a model eco-system

    We study a model ecosystem by means of dynamical techniques from the theory of disordered systems. The model describes a set of species subject to competitive interactions through a background of resources, which they feed upon. Additionally, direct competitive or co-operative interactions between species may occur through a random coupling matrix. We compute the order parameters of the system in a fixed-point regime, identify the onset of instability, and compute the phase diagram. We focus on the effects of the variability of resources, direct interaction between species, co-operation pressure and dilution on the stability and the diversity of the ecosystem. It is shown that resources can be exploited optimally only in the absence of co-operation pressure or direct interaction between species.
    Comment: 23 pages, 13 figures; text of paper modified, discussion extended, references added
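    The abstract does not give the model equations; the generic setting for such analyses is a set of species abundances coupled through a random interaction matrix. The Lotka-Volterra-type growth law and the statistics of the couplings below are purely an illustrative assumption of this sketch, not the paper's model:

```python
import numpy as np

def simulate_ecosystem(S=40, mu=-0.5, sigma=0.2, steps=200, dt=0.05, seed=0):
    """Euler integration of dx_i/dt = x_i (1 - x_i + sum_j a_ij x_j)
    with random couplings a_ij ~ N(mu/S, sigma^2/S) (a_ii = 0)."""
    rng = np.random.default_rng(seed)
    a = rng.normal(mu / S, sigma / np.sqrt(S), size=(S, S))
    np.fill_diagonal(a, 0.0)
    x = rng.uniform(0.5, 1.5, size=S)      # initial abundances
    for _ in range(steps):
        x = x + dt * x * (1.0 - x + a @ x)
        x = np.clip(x, 0.0, None)           # abundances stay non-negative
    return x
```

    In this class of models the fixed-point regime loses stability as the heterogeneity of the couplings (sigma here) grows, which is the kind of onset of instability the phase diagram tracks.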